Problems deciding on new Video Card for New Computer
Hello, just recently I decided to buy a new computer, and I've already picked out one that's perfect for me.
Only problem is, I can't really decide on a video card to purchase for it, as I've been told (and read online) that the video chip it comes with isn't really all that great.
My friend, who knows a lot more about computers than I do, suggested something that's "overclocked", and I did find some cards like that, but after that I can't really tell what's better than what when the prices are nearly all the same.
just because something is overclocked doesn't mean it's any good. an overclocked 8600GTS and a stock HD3850 tend to be very close in price, but the HD3850 will annihilate it in every category.
unfortunately, that system you posted is not very upgradable. it uses an old chipset, it has a fairly weak processor by today's standards, and the power supply it uses won't be able to power a good graphics card.
usually, trying to upgrade a cheap prebuilt computer won't work, simply because the manufacturers use parts that don't have a lot of upgrading headroom. they want you to buy another system instead of upgrading. if you want a system that's cheap and has exactly what you want, you're better off building it yourself, or paying someone to build it for you.
..Okay, so maybe the video card doesn't have to be the BEST, just something that's better than what this computer comes with.
And as for the computer not being very upgradable, that's okay, because it's a hell of a lot better than the comp I'm using now :S
I personally recommend nVidia cards due to a wider range of compatibility with games and APIs. I've been an OpenGL programmer since the mid-'90s, and I can tell you that ATI has trouble with GL-based games to this day. They tend to have slightly faster clock speeds, but that gets trumped by having to run specific things through software.
If you only play Guild Wars, you'll be fine with that card, but if you plan on playing other games, you'll find very few that sport the ATI logo on the box, and many that display nVidia's logo. Oh, and before somebody flames over the whole ATI versus nVidia thing: I own both ATI and nVidia-based systems, so this is a genuinely factual standpoint based on real-world experience, not some kid spouting off what he or she heard from others.
By the way, why are you buying a prebuilt if you know your way around the inside of one? It's cheaper to build your own, and you generally get longer warranties on the internals.
for the OP's price range and system specs, he's limited to either the HD3600 series or the geforce 8600GT/GTS. the HD3650 is generally the more powerful solution, since the 8600 series is pretty much made of fail.
So you're sure that'll run fine on the comp I've chosen? Like the power supply will handle it?
It's quite possible that the power supply can handle it, but it's also possible that it won't. It depends a lot upon the quality of the power supply, and how much (if any) headroom it has. (That is, how much extra power is available beyond what the system normally needs).
Run the system for a while without the new card to make sure everything is working fine. Then install the new card - if the system starts to act a bit "flaky", (or doesn't start at all) you may want to get a larger power supply. Hopefully, you can get a standard power supply for it - I haven't heard of Gateway using proprietary connectors (a la Dell).
The 8600 series runs Oblivion on max detail in 1280x1024 here, so if it can't handle GW, which is much less intense, I'd like to know why.
Also, the power supply is incredibly weak. A 300W PS is just enough to push the CPU, much less anything else. You do realize that the way to determine whether or not you have enough power is to simply add up the consumption under max load by your primary devices, namely the CPU and GPU. For example, if that CPU averages 300W under full load, you're maxed for now, and one or the other will starve when playing a demanding game.
My system has a P4 in it, which requires 350W under a full load. My 7800GS requires 300W under a full load. That's 650W of power, not counting optical drives, USB, and other devices. I have a 700W ToughPower PS in mine, and have had no problems with it. I can tell you from personal experience, though, that running a 450W PS in a system with a 350W CPU and a 250W GPU will result in burned molex connectors and a fried PS after about a year.
Get the right PS to begin with, no matter what CPU/GPU combo you choose. It'll save you a TON of headaches in the long run.
Thanks Quaker, I was trying to locate their specs and was having trouble. Still, that sounds weak, because if his CPU is only 300W, that means that the card in question only requires 100W of power? Normally they're 250W or more. I need the tech specs on those ATI cards, and I always have a heck of a time finding them when I need them.
Also, the power supply is incredibly weak. A 300W PS is just enough to push the CPU, much less anything else. You do realize that the way to determine whether or not you have enough power is to simply add up the consumption under max load by your primary devices, namely the CPU and GPU. For example, if that CPU averages 300W under full load, you're maxed for now, and one or the other will starve when playing a demanding game.
My system has a P4 in it, which requires 350W under a full load. My 7800GS requires 300W under a full load. That's 650W of power, not counting optical drives, USB, and other devices. I have a 700W ToughPower PS in mine, and have had no problems with it. I can tell you from personal experience, though, that running a 450W PS in a system with a 350W CPU and a 250W GPU will result in burned molex connectors and a fried PS after about a year.
Your CPU most certainly does not use 350W, you crazy person. Most CPUs use around 50-100W. There's no need to go above a 500W power supply (in most cases) if you run a single GPU. The quality of the PSU is just as important as its wattage. So congrats on wasting 200+W.
Edit: The 7800GS requires around 100-150W MAX at load.
Last edited by Dark Kal; Aug 20, 2008 at 09:31 AM.
Also, the power supply is incredibly weak. A 300W PS is more than enough to push the CPU. For example, if that CPU averages 300W under full load (which it does not), you're maxed for now, and one or the other will starve when playing a demanding game.
My system has a P4 in it, which requires possibly 100W under a full load. My 7800GS requires 200W under a full load. That's 300W of power, not counting optical drives, USB, and other devices. I have a 700W ToughPower PS in mine, and have had no problems with it. I can tell you from personal experience, though, that running a 450W PS in a system with a 350W CPU (which doesn't exist) and a 250W GPU will result in burned molex connectors and a fried PS after about a year.
Syco Masato: Have you considered building your own? Or choosing parts and getting the local computer store to build it? This is a better method, as it means you aren't paying for cheap generic shit. Maybe consider sending the specs of the $500 PC in moriz's thread to a local store for a quote? In Australia, stores charge an AU$70-80 build fee. Even with this fee, you're getting a much better deal.
the HD3600 series does not require an extra molex connector, so it is limited to the power provided via the PCI-e slot. that's a maximum of 75W. it's probably a lot lower than that, even during gaming. the upcoming HD4670 will use around 59W max, so the HD3600 series (which is less power hungry) should use less than that.
and for the life of me, i can't imagine the rest of his system using 225W. his phenom processor is a low power version, and the rest of his components won't draw much power either. his 300W power supply should be safe.
just as a reference, AMD recommends a 450W power supply for my HD4850, which is a ridiculously powerful (and power hungry) graphics card for its price. it draws a maximum of ~140W under full load. compared to the HD3600 series, the 400W requirement seems pretty ridiculous.
ATI recommends a 400w power supply for the HD36xx series.
I would have added some more to this, but I was in a hurry.
I think, when ATI recommends a certain size power supply, they are probably taking into account the power demands of a typical system, plus some headroom to allow for extra components and reduce the number of Tech Support calls.
That CPU, btw, is listed as requiring 95 watts.
So, as I said before, depending upon the quality of the power supply, 300 watts may be enough, but it would have to be a good power supply with a true 300 watts. Without going into details, power supplies can be rated in various ways and under various conditions, so sometimes the rating is more theoretical than actual.
So, also as I said before, it may be worthwhile to try it with the 300-watt supply, but be prepared to need a larger one. As with any such component, it should be relatively easy to get a good, barely used 350-450 watt PS from some computer hobbyist who's upgraded. I've got a couple of 350-watt units and a 535-watt just kicking around gathering dust.
Here's an interesting link: http://www.extreme.outervision.com/PSUEngine
The Lite version can be used online for free. Plug in the components from the computer and see what you get.
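If you just want a ballpark before trying the calculator, the same idea can be sketched in a few lines of Python. The CPU and GPU figures below come from this thread (95W for the Phenom per its spec, and at most 75W for a card powered only by the PCI-e slot); the draws for the remaining components are rough assumptions, not measured numbers, so treat the result as an estimate only:

```python
# Rough power-draw estimate, mimicking what a PSU calculator does:
# sum the max-load draw of each component, then add headroom.
components = {
    "CPU (Phenom, rated TDP)": 95,      # from the thread
    "GPU (HD36xx, PCI-e slot limit)": 75,  # slot max; real draw is lower
    "Motherboard + RAM": 50,            # assumed typical
    "Hard drive": 10,                   # assumed typical
    "Optical drive": 15,                # assumed typical
    "Fans / USB / misc": 15,            # assumed typical
}

total = sum(components.values())
headroom = 0.20  # leave ~20% spare so the PSU never runs at its limit
recommended = total * (1 + headroom)

print(f"Estimated max draw: {total}W")
print(f"Suggested PSU size: {recommended:.0f}W or more")
```

With these assumptions it lands right around the 300W mark, which matches the thread's conclusion: a *quality* 300W unit is probably fine, a cheap one is a gamble.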
Last edited by Quaker; Aug 20, 2008 at 03:51 PM.
I would recommend you all learn about computers and get a degree before posting bogus information. For starters, I have experienced a PS failing due to running a 5700LE and a P4 at the same time under high demand. I actually burned up one of the connectors, the +12V for the CPU.
Oh, and have I mentioned that I do this kind of work for a living? You would be amazed at how often these weak power supplies cause failure or poor performance in systems. Ever hear of Perot Systems? Yeah, we don't know what we're doing.
Anyway, it's your choice. I have been in this field since the '80s, and have been trained by Intel, AMD, nVidia, and others. My resume speaks for itself. You're welcome to take the advice of some HS kid if you want. It WILL be cheaper doing it this way, but the long-run costs will be greater.
^so says the person who claimed that a pentium 4 requires 350W by itself. you can flaunt your supposed qualifications all you want, but it all evaporates as soon as you post something ridiculous like that.